Sharp Threshold for Multivariate Multi-Response Linear Regression via Block Regularized Lasso

Authors

  • Weiguang Wang
  • Yingbin Liang
  • Eric P. Xing
Abstract

The multivariate multi-response (MVMR) linear regression problem is investigated, in which design matrices are Gaussian with covariance matrices Σ = (Σ^(1), . . . , Σ^(K)) for K linear regressions. The support union of K p-dimensional regression vectors (collected as columns of the matrix B∗) is recovered using the l1/l2-regularized Lasso. Sufficient and necessary conditions on the sample complexity are characterized as a sharp threshold that guarantees successful recovery of the support union. This model was previously studied via the l1/l∞-regularized Lasso by Negahban & Wainwright (2011) and via the l1/l1 + l1/l∞-regularized Lasso by Jalali et al. (2010), where a sharp threshold on the sample complexity is characterized only for K = 2 and under special conditions. In this work, using the l1/l2-regularized Lasso, a sharp threshold on the sample complexity is characterized under only standard regularization conditions. Namely, if n > c_{p1} ψ(B∗, Σ) log(p − s), where c_{p1} is a constant and s is the size of the support set, then the l1/l2-regularized Lasso correctly recovers the support union; and if n < c_{p2} ψ(B∗, Σ) log(p − s), where c_{p2} is a constant, then the l1/l2-regularized Lasso fails to recover the support union. In particular, the function ψ(B∗, Σ) captures the impact of the sparsity of the K regression vectors and of the statistical properties of the design matrices on the sample-complexity threshold. This threshold function thus also demonstrates the advantage of joint support union recovery using the multi-task Lasso over individual support recovery using the single-task Lasso.
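The l1/l2 block penalty described in the abstract can be illustrated with a minimal proximal-gradient (ISTA) sketch: rows of B are shrunk jointly across the K responses, so a row is either zeroed for all tasks or kept for all tasks, which is what enables joint support union recovery. This is an illustrative sketch, not the paper's code; the solver parameters, λ, and problem sizes below are arbitrary assumptions.

```python
import numpy as np

def block_l1l2_lasso(X, Y, lam, step=None, n_iter=500):
    """Minimize (1/(2n)) * ||Y - X B||_F^2 + lam * sum_j ||B[j, :]||_2
    by proximal gradient descent with row-wise group soft-thresholding.
    Rows of the returned B with nonzero norm estimate the support union."""
    n, p = X.shape
    K = Y.shape[1]
    B = np.zeros((p, K))
    if step is None:
        # 1/L, where L = ||X||_2^2 / n is the Lipschitz constant of the gradient
        step = n / np.linalg.norm(X, 2) ** 2
    for _ in range(n_iter):
        grad = X.T @ (X @ B - Y) / n          # gradient of the squared loss
        B = B - step * grad
        # prox of the l1/l2 penalty: shrink each row's l2 norm toward zero
        norms = np.linalg.norm(B, axis=1, keepdims=True)
        shrink = np.maximum(0.0, 1.0 - step * lam / np.maximum(norms, 1e-12))
        B = B * shrink
    return B

# Toy example: K = 3 regressions sharing a common 4-row support union.
rng = np.random.default_rng(0)
n, p, K, s = 200, 50, 3, 4
Bstar = np.zeros((p, K))
Bstar[:s] = 1.0                               # shared support: rows 0..3
X = rng.normal(size=(n, p))
Y = X @ Bstar + 0.1 * rng.normal(size=(n, K))
Bhat = block_l1l2_lasso(X, Y, lam=0.2)
support = np.where(np.linalg.norm(Bhat, axis=1) > 1e-6)[0]
```

Because the penalty couples the K tasks row-wise, a single λ controls the whole row; this is the structural difference from running K separate single-task Lassos, which would recover K possibly inconsistent supports.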


Similar Resources

Block Regularized Lasso for Multivariate Multi-Response Linear Regression

The multivariate multi-response (MVMR) linear regression problem is investigated, in which design matrices are Gaussian with covariance matrices Σ = (Σ^(1), . . . , Σ^(K)) for K linear regressions. The support union of K p-dimensional regression vectors (collected as columns of matrix B∗) is recovered using l1/l2-regularized Lasso. Sufficient and necessary conditions to guarantee successful recovery ...


On the Q-linear Convergence of a Majorized Proximal ADMM for Convex Composite Programming and Its Applications to Regularized Logistic Regression

This paper aims to study the convergence rate of a majorized alternating direction method of multipliers with indefinite proximal terms (iPADMM) for solving linearly constrained convex composite optimization problems. We establish the Q-linear rate convergence theorem for 2-block majorized iPADMM under mild conditions. Based on this result, the convergence rate analysis of symmetric Gaussian-Sei...


Support Union Recovery in High-Dimensional Multivariate Regression

In multivariate regression, a K-dimensional response vector is regressed upon a common set of p covariates, with a matrix B∗ ∈ Rp×K of regression coefficients. We study the behavior of the multivariate group Lasso, in which block regularization based on the ℓ1/ℓ2 norm is used for support union recovery, or recovery of the set of s rows for which B∗ is non-zero. Under high-dimensional scaling, w...


Estimation and Selection via Absolute Penalized Convex Minimization And Its Multistage Adaptive Applications

The ℓ1-penalized method, or the Lasso, has emerged as an important tool for the analysis of large data sets. Many important results have been obtained for the Lasso in linear regression which have led to a deeper understanding of high-dimensional statistical problems. In this article, we consider a class of weighted ℓ1-penalized estimators for convex loss functions of a general form, including ...


Simultaneous support recovery in high dimensions: Benefits and perils of block $\ell_1/\ell_\infty$-regularization

Given a collection of r ≥ 2 linear regression problems in p dimensions, suppose that the regression coefficients share partially common supports. This set-up suggests the use of l1/l∞-regularized regression for joint estimation of the p × r matrix of regression coefficients. We analyze the high-dimensional scaling of l1/l∞-regularized quadratic programming, considering both consistency rates in...



Journal:
  • CoRR

Volume: abs/1307.7993


Publication date: 2013